What the Car? A racing game with a 'complete disregard for actual vehicles'

The Guardian

Imagine a new racing video game. Whatever you've pictured, What the Car? is not it. In a world where racing games pride themselves on the ever-increasing detail and authenticity of their driving experiences, pushing the speedometer towards realism with cutting-edge game engines as well as perfectly simulated motor ones, this is the opposite. This car is literally running around on foot. Described as "an absurdly silly adventure full of racing, laughs, and surprises," What the Car? has you playing as a car with legs, sprinting and climbing through obstacles each more daft than the last, to get to the finish line.


MIXED-SENSE: A Mixed Reality Sensor Emulation Framework for Test and Evaluation of UAVs Against False Data Injection Attacks

Pant, Kartik A., Lin, Li-Yu, Kim, Jaehyeok, Sribunma, Worawis, Goppert, James M., Hwang, Inseok

arXiv.org Artificial Intelligence

We present a high-fidelity Mixed Reality sensor emulation framework for testing and evaluating the resilience of Unmanned Aerial Vehicles (UAVs) against false data injection (FDI) attacks. The proposed approach can be utilized to assess the impact of FDI attacks, benchmark attack detector performance, and validate the effectiveness of mitigation/reconfiguration strategies in single-UAV and UAV swarm operations. Our Mixed Reality framework leverages high-fidelity simulations of Gazebo and a Motion Capture system to emulate proprioceptive (e.g., GNSS) and exteroceptive (e.g., camera) sensor measurements in real-time. We propose an empirical approach to faithfully recreate signal characteristics such as latency and noise in these measurements. Finally, we illustrate the efficacy of our proposed framework through a Mixed Reality experiment consisting of an emulated GNSS attack on an actual UAV, which (i) demonstrates the impact of false data injection attacks on GNSS measurements and (ii) validates a mitigation strategy utilizing a distributed camera network developed in our previous work. Our open-source implementation is available at https://github.com/CogniPilot/mixed_sense
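The abstract's idea of recreating signal characteristics such as latency and noise, and then injecting false data, can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: all function names, parameter values, and the one-dimensional constant-bias attack model are assumptions for demonstration.

```python
import random
from collections import deque

def emulate_gnss(true_positions, noise_std=0.5, delay_steps=2,
                 attack_start=None, attack_offset=0.0, seed=0):
    """Emulate a GNSS-like measurement stream (illustrative sketch).

    true_positions : list of 1-D ground-truth positions (hypothetical)
    noise_std      : std-dev of additive Gaussian noise (assumed model)
    delay_steps    : latency expressed in whole samples (assumed model)
    attack_start   : sample index at which the FDI bias begins, or None
    attack_offset  : constant bias the attacker adds to each measurement
    """
    rng = random.Random(seed)
    buffer = deque()        # FIFO buffer models transport latency
    measurements = []
    for k, p in enumerate(true_positions):
        buffer.append(p)
        if len(buffer) <= delay_steps:
            measurements.append(None)   # no measurement available yet
            continue
        delayed = buffer.popleft()      # sample from delay_steps ago
        m = delayed + rng.gauss(0.0, noise_std)
        if attack_start is not None and k >= attack_start:
            m += attack_offset          # false data injection
        measurements.append(m)
    return measurements
```

With `noise_std=0.0` the latency and attack effects are easy to see: each output lags the truth by `delay_steps` samples, and from `attack_start` onward every measurement carries the injected offset.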


Self-Driving Cars Learn About Road Hazards Through Augmented Reality

#artificialintelligence

For decades, anyone who wanted to know whether a new car was safe to drive could simply put it through its paces, using tests established through trial and error. Such tests might investigate whether the car can take a sharp turn while keeping all four wheels on the road, brake to a stop over a short distance, or survive a collision with a wall while protecting its occupants. But as cars take an ever greater part in driving themselves, such straightforward testing will no longer suffice. We will need to know whether the vehicle has enough intelligence to handle the same kind of driving conditions that humans have always had to manage. To do that, automotive safety-assurance testing has to become less like an obstacle course and more like an IQ test.


What Automotive Companies Are Showing At CES Asia: Electric, Hydrogen, And Autonomy, Of Course

Forbes - Tech

Wandering the halls of CES Asia, it's easy to spot the automotive companies, but sometimes you have to look at the logos instead of looking for actual vehicles on display. The booths here at the show in Shanghai, China are full of global and domestic brands showing off everything from self-driving vehicles to futuristic concepts to useful doo-dads for today's cars. The focus of the automakers here at @CESAsia is obviously not on the cars themselves, but on mobility. You hear this a lot, but seeing the @Honda booth with no actual cars in it really drives that home. Still, just because Honda didn't have any actual vehicles on display doesn't mean every company chose to promote mobility ideas over cars.